8,756 research outputs found
Estimating the COP Exchange Rate Volatility Smile and the Market Effect of Central Bank Interventions: A CHARN Approach
In this paper we estimate a volatility model for the COP/USD exchange rate over two samples, one containing only the information before the "discretionary interventions" started and the other covering the whole period. We use a nonparametric approach to estimate the conditional mean and "volatility smile" functions of returns from daily data. For the pre-intervention sample, we find a nonlinear expected-return function and, surprisingly, a non-symmetric "volatility smile". The departures from linearity and symmetry are associated with absolute returns above 1.5% and 1.0%, respectively. We also find that the "discretionary interventions" did not shift the mean response function, but moved expected returns along it towards the required levels. In contrast, the "volatility smile" tends to increase in a non-symmetric way once the "discretionary interventions" are accounted for. The Sep/29/2004 announcement does not seem to have had any effect on the expected conditional mean or variance functions, but the Dec/17/2004 announcement appears to be related to non-symmetric effects on the volatility smile. We conclude that the monetary authority's discretionary interventions were more effective when their timing and amount were unannounced.
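The abstract does not reproduce the estimator, but the CHARN setup it describes — a nonparametric conditional mean and conditional volatility of returns — is typically fitted with kernel regression. A minimal sketch of that idea (Nadaraya-Watson smoothing on lagged returns; function names and the bandwidth choice are my own, not from the paper):

```python
import numpy as np

def nadaraya_watson(x, y, grid, h):
    """Kernel regression estimate of E[y | x] on a grid (Gaussian kernel)."""
    est = np.empty_like(grid, dtype=float)
    for j, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / h) ** 2)
        est[j] = np.sum(w * y) / np.sum(w)
    return est

def charn_mean_vol(returns, grid, h=0.5):
    """Estimate m(.) and s(.) in the CHARN form
       r_t = m(r_{t-1}) + s(r_{t-1}) * eps_t."""
    x, y = returns[:-1], returns[1:]
    m = nadaraya_watson(x, y, grid, h)                    # conditional mean
    resid2 = (y - nadaraya_watson(x, y, x, h)) ** 2       # squared residuals
    v = nadaraya_watson(x, resid2, grid, h)               # conditional variance
    return m, np.sqrt(v)                                  # the "volatility smile"
```

Plotting the returned volatility function against the lagged-return grid is what would reveal the (a)symmetry of the smile discussed above.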
Geometric Objects: A Quality Index to Electromagnetic Energy Transfer Performance in Sustainable Smart Buildings
Sustainable smart buildings play an essential role in the efficient use of energy. However, as electric loads, these buildings are affected by significant distortion of the current and voltage waveforms, caused by the growing proliferation of nonlinear electronic devices. Overall, buildings around the world consume a significant amount of energy, about one-third of total primary energy resources. Optimizing the transfer of such an amount of energy is a crucial issue that requires specific tools to integrate energy-efficient behaviour throughout the grid. When nonlinear loads are present, new approaches are needed to account for the effects of harmonics and the related power components. Accordingly, technological innovation is needed to update the power factor concept to a generalized total, or "true", power factor, in which the different power components involved in its calculation properly reflect each harmonic interaction. This work addresses an innovative theory that applies the Poynting vector philosophy, via Geometric Algebra, to the electromagnetic energy transfer process, giving it a physical foundation. In this framework, it is possible to analyse and detect the nature of disturbing loads amid the exponential growth of new globalized buildings and architectures in our era. This new insight is based on the concept of geometric objects of different dimensions: vectors, bivectors, trivectors and multivectors. Within this paper, these objects are correlated with the electromagnetic quantities responsible for the energy flow supplied to the most common loads in sustainable smart buildings. Moreover, these phenomena are characterized by a multivector quality index suitable even for detecting harmonic sources. A numerical example illustrates the capabilities of the suggested index when applied to industrial loads, with a view to optimizing energy control systems and enhancing comfort management in sustainable smart buildings.
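As a numerical illustration of the generalized ("true") power factor the abstract refers to: under distorted waveforms it is no longer the displacement factor cos(φ) but P / (Vrms·Irms). A minimal sketch with made-up waveform values (the abstract's Geometric Algebra index itself is not reproduced here):

```python
import numpy as np

# One 50 Hz cycle, finely sampled. The current carries a 3rd harmonic,
# a typical nonlinear-load signature; amplitudes are illustrative only.
t = np.linspace(0.0, 0.02, 2000, endpoint=False)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
i = 10 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t - 0.3) \
  + 4 * np.sqrt(2) * np.sin(2 * np.pi * 150 * t)

P = np.mean(v * i)                                    # active power (W)
S = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))  # apparent power (VA)
true_pf = P / S                # generalized "true" power factor
disp_pf = np.cos(0.3)          # displacement factor of the fundamental alone
```

Here `true_pf` comes out below `disp_pf`: the harmonic current adds apparent power without contributing active power, which is exactly why the classical cos(φ) overstates the transfer efficiency of nonlinear loads.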
14-N: Treatment of Information in the Mass Media
[Abstract] The mass media fulfil a very important function: the transmission of information on a large scale. However, the economic interests to which they are subject can condition the message and, with it, the shaping of public opinion. This study uses a quantitative analysis of two major media outlets to determine whether the information they publish is indeed subject to filters that condition its quality, and to examine the role of civil society in communication. Taking an event of wide significance, the general strike of 14-N, the data obtained are contrasted with those of an alternative digital newspaper.

Final degree project (UDC-SOC). Sociology. Academic year 2012/201
Chitosan nanogels for the controlled release of polyoxometalates with biomedical applications
For many years, the use of polymers of natural origin was limited to everyday materials such as wool, silk or natural rubber. However, biomaterials, traditionally defined as materials for clinical use, have grown increasingly sophisticated and are now the subject of many lines of research related to medicine, biotechnology, genetics and pharmacy. This development of new biomaterials has driven the use of biodegradable polymers for biomedical applications. In this field, hydrogels are of great relevance.
Efficiency and Reliability in Bringing AI into Transport and Smart Cities Solutions
Artificial Intelligence has seen a revival in the last decade. The need for progress, the growing processing capacity and the low cost of the Cloud have facilitated the development of new, powerful
algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and
Convolutional Networks is transforming the way we work and is opening new horizons. Thanks
to them, we can now analyse data and obtain unimaginable solutions to today’s problems.
Nevertheless, our success is not entirely based on algorithms; it also comes from our ability to
follow our "gut" when choosing the best combination of algorithms for an intelligent artefact.
Their development involves the use of both connectionist and symbolic systems, that is to say
data and knowledge. Moreover, it is necessary to work with both historical and real-time data. It
is also important to consider development time, costs and the ability to create systems that will
interact with their environment, will connect with the objects that surround them and will
manage the data they obtain in a reliable manner.
In this keynote, the evolution of intelligent computer systems will be examined, especially that
of convolutional networks. The need for human capital will be discussed, as well as the need to
follow one’s “gut instinct” in problem-solving.
Furthermore, the importance of IoT and Blockchain in the development of intelligent systems
will be analysed and it will be shown how tools like "Deep Intelligence" make it possible to create
computer systems efficiently and effectively. "Smart" infrastructures need to incorporate all
added-value resources so they can offer useful services to society, while reducing costs,
ensuring reliability and improving citizens' quality of life. The combination of AI with
IoT and with blockchain offers a world of possibilities and opportunities.
The development of transport, smart cities, urbanizations and leisure areas can be improved
through the use of distributed intelligent computer systems. In this regard, edge platforms or fog
computing help increase efficiency, reduce network latency, improve security and bring
intelligence to the edge of the network, the sensors, users and the environment.
Several use cases of intelligent systems will be presented, and it will be analysed how the
processes of implementation and use have been optimized by means of different tools.
Piezoelectric transducer design optimization
To design a torsion sensor for ultrasonic tissue-mechanics applications, finite element models (FEM) are needed as the forward procedure. Using a simplified analytical model of the torsion transducer, we determine the resonant frequency of an ultrasonic torque transducer and validate it computationally. More specifically, the idea is to refine and optimize a design to be applied to the detection of preterm birth, identifying changes in the consistency of cervical tissue through measurements of the shear modulus. A model with a disk transmitter and a ring receiver was therefore chosen for easy accessibility, and a sensitivity analysis was carried out to find the range of optimal design values for this application. The piezoelectric transducer model must be optimized with respect to two types of parameters: on the one hand the design parameters, and on the other the model parameters that characterize the specimen. The forward problem is solved with a three-dimensional finite element simulation. Experimental measurements are simulated by adding Gaussian noise as a percentage of the RMS of the numerically predicted signals. In addition, a semi-analytical estimate of the probability of detection (POD) is developed to provide a rational criterion for optimizing the experimental design.

Universidad de Granada. Departamento de Mecánica de Estructuras e Ingeniería Hidráulica. Máster Universitario en Estructuras, academic year 2011-2012. This work was funded by the Ministerio de Educación through Dpi2010-1706
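The simulated-measurement scheme described above — Gaussian noise with standard deviation set to a percentage of the predicted signal's RMS — together with a simple Monte Carlo POD estimate, can be sketched as follows. The test signal, threshold and peak-amplitude detection rule are illustrative assumptions, not taken from the actual FEM model (whose POD estimate is semi-analytical):

```python
import numpy as np

def add_rms_noise(signal, pct, rng):
    """Add Gaussian noise with std equal to pct% of the signal RMS,
    mimicking the simulated-measurement procedure described above."""
    rms = np.sqrt(np.mean(signal ** 2))
    return signal + rng.normal(0.0, pct / 100.0 * rms, size=signal.shape)

def estimate_pod(signal, pct, threshold, trials=1000, rng=None):
    """Monte Carlo probability of detection: fraction of noisy
    realizations whose peak amplitude exceeds the threshold."""
    rng = rng if rng is not None else np.random.default_rng(0)
    hits = sum(
        np.max(np.abs(add_rms_noise(signal, pct, rng))) > threshold
        for _ in range(trials)
    )
    return hits / trials
```

Sweeping `estimate_pod` over the design parameters (here it would be the transducer geometry driving the predicted signal) is what turns the POD into a rational criterion for choosing the experimental design.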
DeepTech - AI Models in Engineering Solutions
Artificial Intelligence has seen a revival in the last decade. The need for progress, the growing
processing capacity and the low cost of the Cloud have facilitated the development of new,
powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep
Learning and Convolutional Networks is transforming the way we work and is opening new
horizons. Thanks to them, we can now analyse data and obtain unimaginable solutions to
today’s problems. Nevertheless, our success is not entirely based on algorithms; it also
comes from our ability to follow our “gut” when choosing the best combination of algorithms
for an intelligent artefact. It's about approaching engineering with a lot of knowledge and
tact. This involves the use of both connectionist and symbolic systems, and of having a full
understanding of the algorithms used. Moreover, to address today’s problems we must
work with both historical and real-time data. We must fully comprehend the problem, its
time evolution, as well as the relevance and implications of each piece of data, etc. It is also
important to consider development time, costs and the ability to create systems that will
interact with their environment, will connect with the objects that surround them and will
manage the data they obtain in a reliable manner.
- …